Tree Pruning for Output Coded Ensembles

Authors

  • Terry Windeatt
  • Gholamreza Ardeshir
Abstract

Output Coding is a method of converting a multiclass problem into several binary subproblems, giving an ensemble of binary classifiers. Like other ensemble methods, its performance depends on the accuracy and diversity of the base classifiers. If a decision tree is chosen as the base classifier, the issue of tree pruning needs to be addressed. In this paper we investigate the effect of six pruning methods on ensembles of trees generated by Error-Correcting Output Coding (ECOC). Our results show that Error-Based Pruning outperforms the other methods on most datasets, but it is better not to prune at all than to select a single pruning strategy for all datasets.
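The construction the abstract describes can be sketched briefly: a binary code matrix turns one K-class problem into several binary subproblems, and a test point is decoded to the class whose codeword is nearest in Hamming distance to the ensemble's votes. The following is a minimal numpy sketch under assumed details (random code matrix, a toy nearest-mean base learner standing in for the paper's decision trees; all function names are illustrative, not from the paper):

```python
import numpy as np

def nearest_mean_classifier(X_tr, y_bin, X_te):
    # Toy base learner standing in for the decision trees used in the
    # paper: predict 1 if a test point is closer to the class-1 mean.
    m0 = X_tr[y_bin == 0].mean(axis=0)
    m1 = X_tr[y_bin == 1].mean(axis=0)
    d0 = np.linalg.norm(X_te - m0, axis=1)
    d1 = np.linalg.norm(X_te - m1, axis=1)
    return (d1 < d0).astype(float)

def ecoc_fit_predict(X_train, y_train, X_test, n_bits=7, seed=0):
    rng = np.random.default_rng(seed)
    classes = np.unique(y_train)
    # Random code matrix: one binary codeword (row) per class;
    # resample until all codewords are distinct.
    code = rng.integers(0, 2, size=(len(classes), n_bits))
    while len(np.unique(code, axis=0)) < len(classes):
        code = rng.integers(0, 2, size=(len(classes), n_bits))
    votes = np.zeros((len(X_test), n_bits))
    for b in range(n_bits):
        # Column b of the code relabels the data into a binary subproblem.
        y_bin = code[np.searchsorted(classes, y_train), b]
        if y_bin.min() == y_bin.max():
            votes[:, b] = y_bin[0]  # degenerate column: nothing to learn
            continue
        votes[:, b] = nearest_mean_classifier(X_train, y_bin, X_test)
    # Decode: assign each point to the class with the nearest codeword
    # in Hamming distance.
    dists = np.abs(votes[:, None, :] - code[None, :, :]).sum(axis=2)
    return classes[dists.argmin(axis=1)]
```

In the paper each binary subproblem would be learned by a (pruned or unpruned) decision tree; the nearest-mean stump here only keeps the sketch self-contained.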


Related articles

Boosting Lazy Decision Trees

This paper explores the problem of how to construct lazy decision tree ensembles. We present and empirically evaluate a relevance-based boosting-style algorithm that builds a lazy decision tree ensemble customized for each test instance. From the experimental results, we conclude that our boosting-style algorithm significantly improves the performance of the base learner. An empirical comparison...


Achieving Single-User Performance in a FEC-coded DS-CDMA system using the Low-Complexity Soft-Output M-Algorithm

This paper presents a soft-output version of the M-algorithm (SOMA) to reduce the complexity of a turbo multiuser detector in a DS-CDMA system employing a forward error correction code. The SOMA reduces the complexity by pruning an equivalent tree that represents all possible transmit signal vectors. It utilizes paths that are traversed but discarded in a pruned tree to reduce the total number of...


New Ordering-Based Pruning Metrics for Ensembles of Classifiers in Imbalanced Datasets

The task of classification with imbalanced datasets has attracted considerable interest from researchers in recent years. The reason behind this fact is that many applications and real problems present this feature, causing standard learning algorithms to fall short of the expected performance. Accordingly, many approaches have been designed to address this problem from different perspectives, i.e., da...


Optimally Pruning Decision Tree Ensembles With Feature Cost

We consider the problem of learning decision rules for prediction under a feature budget constraint. In particular, we are interested in pruning an ensemble of decision trees to reduce expected feature cost while maintaining high prediction accuracy for any test example. We propose a novel 0-1 integer program formulation for ensemble pruning. Our pruning formulation is general: it takes any ensembl...


Effective Pruning of Neural Network Classifier Ensembles

Neural network ensemble techniques have been shown to be very accurate classification techniques. However, in some real-life applications the number of classifiers required to achieve reasonable accuracy is enormously large, and the ensemble is hence very space-consuming. This paper proposes several methods for pruning neural network ensembles. The clustering-based approach applies k-means clustering to entire ...
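The clustering idea this abstract hints at can be illustrated with a short numpy fragment. This is only an assumed reading of the truncated text (the function name, the plain Lloyd iterations, and the "one representative per cluster" rule are illustrative, not the paper's method): members whose prediction vectors on a validation set cluster together are treated as redundant, and one member per cluster is kept.

```python
import numpy as np

def prune_ensemble_by_clustering(member_preds, k, seed=0):
    # member_preds: (n_members, n_val_points) array of each member's
    # predictions on a validation set. Returns indices of kept members.
    rng = np.random.default_rng(seed)
    X = member_preds.astype(float)
    # Plain Lloyd-style k-means on the prediction vectors.
    centers = X[rng.choice(len(X), size=k, replace=False)]
    for _ in range(20):
        d = np.linalg.norm(X[:, None] - centers[None], axis=2)
        labels = d.argmin(axis=1)
        for j in range(k):
            if (labels == j).any():
                centers[j] = X[labels == j].mean(axis=0)
    # Keep the member closest to each centroid as its representative.
    keep = []
    for j in range(k):
        members = np.flatnonzero(labels == j)
        if members.size:
            d_j = np.linalg.norm(X[members] - centers[j], axis=1)
            keep.append(int(members[d_j.argmin()]))
    return sorted(keep)
```

The choice of k here trades ensemble size against diversity: fewer clusters prune more aggressively but risk discarding members that disagree usefully.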



Journal:

Volume   Issue

Pages  -

Publication date: 2002